Bayes-Optimal Convolutional AMP

Authors

Abstract

This paper proposes Bayes-optimal convolutional approximate message-passing (CAMP) for signal recovery in compressed sensing. CAMP uses the same low-complexity matched filter (MF) for interference suppression as AMP. To improve the convergence property of AMP for ill-conditioned sensing matrices, the so-called Onsager correction term in AMP is replaced by a convolution of all preceding messages. The tap coefficients in the convolution are determined so as to realize the asymptotic Gaussianity of estimation errors via state evolution (SE) under the assumption of orthogonally invariant sensing matrices. An SE equation is derived to optimize the sequence of denoisers in CAMP. The optimized CAMP is proved to be Bayes-optimal for all orthogonally invariant sensing matrices if the SE equation converges to a fixed-point and if the fixed-point is unique. For sensing matrices with low-to-moderate condition numbers, CAMP can achieve the same performance as high-complexity orthogonal/vector AMP that requires the linear minimum mean-square error (LMMSE) filter instead of the MF.
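To make the structure of the iteration concrete, the following is a minimal illustrative sketch of a CAMP-style update loop. It is not the paper's algorithm: the tap coefficients `taps` are hypothetical placeholders (the paper derives them via state evolution to restore asymptotic Gaussianity), and a simple soft-thresholding denoiser stands in for the optimized sequence of Bayes-optimal denoisers.

```python
import numpy as np

def soft_threshold(v, theta):
    # Elementwise soft-thresholding denoiser (a common illustrative
    # choice; the paper instead optimizes a sequence of denoisers
    # via state evolution).
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def camp_sketch(A, y, taps, theta=0.1, n_iter=10):
    """Illustrative CAMP-style iteration (sketch only).

    AMP's single Onsager correction term is replaced by a convolution
    of ALL preceding residual messages with tap coefficients
    taps[t][tau].  The taps here are placeholders supplied by the
    caller; in the paper they are determined from state evolution.
    """
    M, N = A.shape
    x = np.zeros(N)
    residuals = []
    for t in range(n_iter):
        z = y - A @ x
        # Convolutional correction over all preceding residuals
        # (AMP would use only a single Onsager term instead).
        for tau, z_tau in enumerate(residuals):
            z = z + taps[t][tau] * z_tau
        residuals.append(z.copy())
        # Low-complexity matched-filter (MF) interference suppression,
        # in place of the LMMSE filter used by orthogonal/vector AMP.
        r = x + A.T @ z
        x = soft_threshold(r, theta)
    return x
```

With all taps set to zero the loop degenerates to plain iterative soft-thresholding, which highlights that the convolutional correction (with properly chosen taps) is what distinguishes CAMP.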


Related articles

Bayes-Optimal Chemotaxis

Chemotaxis plays a crucial role in many biological processes, including nervous system development. However, fundamental physical constraints limit the ability of a small sensing device such as a cell or growth cone to detect an external chemical gradient. One of these is the stochastic nature of receptor binding, leading to a constantly fluctuating binding pattern across the cell's array of re...


Restricted Bayes Optimal Classifiers

We introduce the notion of restricted Bayes optimal classifiers. These classifiers attempt to combine the flexibility of the generative approach to classification with the high accuracy associated with discriminative learning. They first create a model of the joint distribution over class labels and features. Instead of choosing the decision boundary induced directly from the model, they restri...


Bayes Optimal Instance-Based Learning

In this paper we present a probabilistic formalization of the instance-based learning approach. In our Bayesian framework, moving from the construction of an explicit hypothesis to a data-driven instance-based learning approach is equivalent to averaging over all the (possibly infinitely many) individual models. The general Bayesian instance-based learning framework described in this paper can ...


Learning Optimal Augmented Bayes Networks

Naive Bayes is a simple Bayesian classifier with strong independence assumptions among the attributes. This classifier, despite its strong independence assumptions, often performs well in practice. It is believed that relaxing the independence assumptions of a naive Bayes classifier may improve the classification accuracy of the resulting structure. While finding an optimal unconstrained Bayesi...


When Naïve Bayes Nearest Neighbors Meet Convolutional Neural Networks

Since Convolutional Neural Networks (CNNs) have become the leading learning paradigm in visual recognition, Naive Bayes Nearest Neighbor (NBNN)-based classifiers have lost momentum in the community. This is because (1) such algorithms cannot use CNN activations as input features; (2) they cannot be used as the final layer of CNN architectures for end-to-end training; and (3) they are generally not...



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2021

ISSN: 0018-9448, 1557-9654

DOI: https://doi.org/10.1109/tit.2021.3077471